Quantum Limits of Thermometry
The precision of typical thermometers consisting of N particles is shot-noise limited, improving as 1/sqrt(N). For high-precision thermometry and thermometric standards this presents an important theoretical noise floor. Here it is demonstrated that thermometry may be mapped onto the problem of phase estimation, and using techniques from optimal phase estimation, it follows that the scaling of the precision of a thermometer may in principle be improved to 1/N, representing a Heisenberg limit to thermometry.
Comment: 4 pages
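The two scalings can be contrasted with a minimal numerical sketch (the function names are illustrative, not from the paper): shot-noise-limited uncertainty falls as 1/sqrt(N), while a Heisenberg-limited protocol falls as 1/N.

```python
import math

def shot_noise_precision(n_particles: int, delta0: float = 1.0) -> float:
    """Shot-noise (standard quantum limit) scaling: uncertainty ~ 1/sqrt(N)."""
    return delta0 / math.sqrt(n_particles)

def heisenberg_precision(n_particles: int, delta0: float = 1.0) -> float:
    """Heisenberg-limit scaling: uncertainty ~ 1/N."""
    return delta0 / n_particles

# At N = 10^4 the Heisenberg protocol is 100x more precise than shot noise:
for n in (100, 10_000):
    print(n, shot_noise_precision(n), heisenberg_precision(n))
```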
Dephasing-assisted Gain and Loss in Mesoscopic Quantum Systems
Motivated by recent experiments, we analyse the phonon-assisted steady-state
gain of a microwave field driving a double quantum-dot in a resonator. We apply
the results of our companion paper, which derives the complete set of
fourth-order Lindblad dissipators using Keldysh methods, to show that resonator
gain and loss are substantially affected by dephasing-assisted dissipative
processes in the quantum-dot system. These additional processes, which go
beyond recently proposed polaronic theories, are in good quantitative agreement
with experimental observations.
Comment: 5 pages, 3 figures, published together with arXiv:1608.0416
Fault tolerant quantum computation with very high threshold for loss errors
Many proposals for fault tolerant quantum computation (FTQC) suffer from
detectable loss processes. Here we show that topological FTQC schemes, which
are known to have high error thresholds, are also extremely robust against
losses. We demonstrate that these schemes tolerate loss rates up to 24.9%,
determined by bond percolation on a cubic lattice. Our numerical results show
that these schemes retain good performance when loss and computational errors
are simultaneously present.
Comment: 4 pages, comments still very welcome. v2 is a reasonable approximation to the published version
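The 24.9% figure quoted above matches the bond-percolation threshold of the simple cubic lattice (p_c ≈ 0.2488). A hedged Monte Carlo sketch (not the paper's code) that tests whether a fraction p of open bonds spans a finite cubic lattice:

```python
import random

def percolates(L: int, p: float, seed: int = 0) -> bool:
    """Bond percolation on an L x L x L simple cubic lattice: open each
    nearest-neighbour bond with probability p, then test whether the z=0
    face connects to the z=L-1 face (union-find with path halving)."""
    rng = random.Random(seed)
    parent = list(range(L ** 3 + 2))
    top, bottom = L ** 3, L ** 3 + 1   # virtual nodes for the two faces

    def find(a: int) -> int:
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    def union(a: int, b: int) -> None:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    idx = lambda x, y, z: (z * L + y) * L + x
    for z in range(L):
        for y in range(L):
            for x in range(L):
                i = idx(x, y, z)
                if z == 0:
                    union(i, top)
                if z == L - 1:
                    union(i, bottom)
                # open bonds in the +x, +y, +z directions only (no double count)
                for dx, dy, dz in ((1, 0, 0), (0, 1, 0), (0, 0, 1)):
                    nx, ny, nz = x + dx, y + dy, z + dz
                    if nx < L and ny < L and nz < L and rng.random() < p:
                        union(i, idx(nx, ny, nz))
    return find(top) == find(bottom)

# Far below and far above the cubic bond-percolation threshold p_c ~ 0.2488:
print(percolates(10, 0.02), percolates(10, 0.60))
```

Sweeping p and averaging over seeds recovers the sharp spanning transition near p_c as L grows.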
Detecting itinerant single microwave photons
Single photon detectors are fundamental tools of investigation in quantum
optics and play a central role in measurement theory and quantum informatics.
Photodetectors based on different technologies exist at optical frequencies and
much effort is currently being spent on pushing their efficiencies to meet the
demands coming from the quantum computing and quantum communication proposals.
In the microwave regime, however, a single photon detector has remained elusive,
although several theoretical proposals have been put forth. In this article, we
review these recent proposals, especially focusing on non-destructive detectors
of propagating microwave photons. These detection schemes using superconducting
artificial atoms can reach detection efficiencies of 90\% with existing
technologies and are ripe for experimental investigations.
Comment: 11 pages, 8 figures
Neutron star heating constraints on wave-function collapse models
Spontaneous wavefunction collapse models, such as Continuous Spontaneous
Localization (CSL), are designed to suppress macroscopic superpositions while
preserving microscopic quantum phenomena. An observable consequence of collapse
models is spontaneous heating of massive objects. Here we calculate the
collapse-induced heating rate of astrophysical objects, and the corresponding
equilibrium temperature. We apply these results to neutron stars, the densest
phase of baryonic matter in the universe. Stronger collapse model parameters
imply greater heating, allowing us to derive competitive bounds on model
parameters using neutron star observational data, and to propose speculative
bounds based on the capabilities of current and future astronomical surveys.
Comment: v3: minor modifications, close to published version. v2: Thanks to a correspondence with Philip Pearle, we found an error in our previous calculation shortly after submission. This revision corrects the error, which significantly changes our conclusions and discussion
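The equilibrium-temperature logic can be sketched schematically: balance the standard CSL heating rate per nucleon, 3*lambda*hbar^2/(4*m*r_C^2), against blackbody emission from the star's surface. The parameter values below are illustrative placeholders, not the paper's fitted bounds.

```python
import math

HBAR = 1.054571817e-34      # J s
SIGMA = 5.670374419e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
M_NUCLEON = 1.67262192e-27  # kg

def csl_heating_per_nucleon(lam: float, r_c: float) -> float:
    """Standard CSL energy-gain rate per nucleon: 3*lam*hbar^2/(4*m*r_c^2)."""
    return 3 * lam * HBAR ** 2 / (4 * M_NUCLEON * r_c ** 2)

def equilibrium_temperature(lam: float, r_c: float,
                            n_nucleons: float, radius: float) -> float:
    """Temperature at which blackbody emission from a sphere of the given
    radius balances the total collapse-induced heating power."""
    power = n_nucleons * csl_heating_per_nucleon(lam, r_c)
    return (power / (4 * math.pi * radius ** 2 * SIGMA)) ** 0.25

# Illustrative numbers: GRW-like parameters, a ~1.4 solar-mass neutron star.
lam, r_c = 1e-16, 1e-7             # collapse rate (1/s), correlation length (m)
n = 1.4 * 1.989e30 / M_NUCLEON     # nucleon count
print(equilibrium_temperature(lam, r_c, n, 1.2e4))  # 12 km radius
```

Since the heating power grows linearly with lambda, a measured (or bounded) surface temperature translates directly into a bound on the collapse-rate parameter.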
The effect of noise correlations on randomized benchmarking
Among the most popular and well-studied quantum characterization,
verification and validation techniques is randomized benchmarking (RB), an
important statistical tool used to characterize the performance of physical
logic operations useful in quantum information processing. In this work we
provide a detailed mathematical treatment of the effect of temporal noise
correlations on the outcomes of RB protocols. We provide a fully analytic
framework capturing the accumulation of error in RB expressed in terms of a
three-dimensional random walk in "Pauli space." Using this framework we derive
the probability density function describing RB outcomes (averaged over noise)
for both Markovian and correlated errors, which we show is generally described
by a gamma distribution with shape and scale parameters depending on the
correlation structure. Long temporal correlations impart large nonvanishing
variance and skew in the distribution towards high-fidelity outcomes --
consistent with existing experimental data -- highlighting potential
finite-sampling pitfalls and the divergence of the mean RB outcome from
worst-case errors in the presence of noise correlations. We use the
Filter-transfer function formalism to reveal the underlying reason for these
differences in terms of effective coherent averaging of correlated errors in
certain random sequences. We conclude by commenting on the impact of these
calculations on the utility of single-metric approaches to quantum
characterization, verification, and validation.
Comment: Updated and expanded to include full derivation. Related papers available from http://www.physics.usyd.edu.au/~mbiercuk/Publications.htm
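The qualitative effect of correlations can be seen in a toy version of the "Pauli space" random walk (a schematic sketch, not the paper's analytic framework): Markovian noise draws a fresh error direction each gate, while strongly correlated noise reuses one direction for the whole sequence, inflating the spread of outcomes.

```python
import random
import statistics

def rb_walk(n_gates: int, rng: random.Random, correlated: bool) -> float:
    """Accumulate per-gate error vectors as a 3D random walk.
    Markovian: a fresh random direction each gate.
    Correlated: one fixed direction for the whole sequence (coherent build-up).
    Returns the squared walk length, a proxy for sequence infidelity."""
    fixed = [rng.gauss(0, 1) for _ in range(3)]
    walk = [0.0, 0.0, 0.0]
    eps = 0.01  # per-gate error amplitude (illustrative)
    for _ in range(n_gates):
        step = fixed if correlated else [rng.gauss(0, 1) for _ in range(3)]
        for i in range(3):
            walk[i] += eps * step[i]
    return sum(w * w for w in walk)

rng = random.Random(42)
markov = [rb_walk(100, rng, correlated=False) for _ in range(2000)]
corr = [rb_walk(100, rng, correlated=True) for _ in range(2000)]
# Correlated errors add coherently, so the outcome distribution is far wider:
print(statistics.variance(corr) > statistics.variance(markov))
```

This mirrors the abstract's point: long temporal correlations impart a large nonvanishing variance, so a single mean RB number can badly underrepresent worst-case behaviour.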
Breaking time-reversal symmetry with a superconducting flux capacitor
We present the design of a passive, on-chip microwave circulator based on a
ring of superconducting tunnel junctions. We investigate two distinct physical
realisations, based on either Josephson junctions (JJ) or quantum phase slip
elements (QPS), with microwave ports coupled either capacitively (JJ) or
inductively (QPS) to the ring structure. A constant bias applied to the center
of the ring provides the symmetry breaking (effective) magnetic field, and no
microwave or rf bias is required. We find that this design offers high
isolation even when taking into account fabrication imperfections and
environmentally induced bias perturbations and find a bandwidth in excess of
500 MHz for realistic device parameters.
Comment: 10 pages, 11 figures, including supplementary material - published as "Passive on-chip, superconducting circulator using rings of tunnel junctions"
Fault-tolerance thresholds for the surface code with fabrication errors
The construction of topological error correction codes requires the ability
to fabricate a lattice of physical qubits embedded on a manifold with a
non-trivial topology such that the quantum information is encoded in the global
degrees of freedom (i.e. the topology) of the manifold. However, the
manufacturing of large-scale topological devices will undoubtedly suffer from
fabrication errors---permanent faulty components such as missing physical
qubits or failed entangling gates---introducing permanent defects into the
topology of the lattice and hence significantly reducing the distance of the
code and the quality of the encoded logical qubits. In this work we investigate
how fabrication errors affect the performance of topological codes, using the
surface code as the testbed. A known approach to mitigate defective lattices
involves the use of primitive SWAP gates in a long sequence of syndrome
extraction circuits. Instead, we show that in the presence of fabrication
errors the syndrome can be determined using the supercheck operator approach
and the outcome of the defective gauge stabilizer generators without any
additional computational overhead or the use of SWAP gates. We report numerical
fault-tolerance thresholds in the presence of both qubit fabrication and gate
fabrication errors using a circuit-based noise model and the minimum-weight
perfect matching decoder. Our numerical analysis is most applicable to 2D
chip-based technologies, but the techniques presented here can be readily
extended to other topological architectures. We find that in the presence of 8%
qubit fabrication errors, the surface code can still tolerate a computational
error rate of up to 0.1%.
Comment: 10 pages, 15 figures
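The supercheck construction can be illustrated with a toy example (the qubit labels and helper function are hypothetical, not from the paper's software): two weight-4 plaquette stabilizers that share a faulty data qubit are multiplied into a single weight-6 supercheck, since Z*Z = I on the shared qubit.

```python
def supercheck(stabilizers, faulty):
    """Merge all Z-type stabilizer supports touching the faulty qubits into
    one 'supercheck' operator. Supports are frozensets of qubit labels; the
    product's support is the symmetric difference, so shared qubits
    (including the faulty one) cancel out. Returns (merged, untouched)."""
    touched = [s for s in stabilizers if s & faulty]
    merged = frozenset()
    for s in touched:
        merged = merged.symmetric_difference(s)
    untouched = [s for s in stabilizers if not (s & faulty)]
    return merged, untouched

# Two weight-4 plaquettes sharing faulty qubit 4 merge into a weight-6 check;
# the third plaquette does not touch the defect and is left alone:
plaquettes = [frozenset({0, 1, 3, 4}),
              frozenset({4, 5, 7, 8}),
              frozenset({2, 6, 9, 10})]
merged, rest = supercheck(plaquettes, frozenset({4}))
print(sorted(merged))  # [0, 1, 3, 5, 7, 8]
```

The merged operator still detects errors on the surviving qubits, at the cost of a lower effective code distance around the defect, which is why heavy fabrication damage degrades the threshold.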
Nonreciprocal Atomic Scattering: A saturable, quantum Yagi-Uda antenna
Recent theoretical studies of a pair of atoms in a 1D waveguide find that the
system responds asymmetrically to incident fields from opposing directions at
low powers. Since there are no explicit time-reversal symmetry-breaking elements
in the device, this has caused some debate. Here we show that the asymmetry
arises from the formation of a quasi-dark-state of the two atoms, which
saturates at extremely low power. In this case the nonlinear saturability
explicitly breaks the assumptions of the Lorentz reciprocity theorem. Moreover,
we show that the statistics of the output field from the driven system can be
explained by a very simple stochastic mirror model and that at steady state,
the two atoms and the local field are driven to an entangled, tripartite
state. Because of this, we argue that the device is
better understood as a saturable Yagi-Uda antenna, a distributed system of
differentially-tuned dipoles that couples asymmetrically to external fields.
Comment: 12 pages, 5 figures
- …